A Leave-K-Out Cross-Validation Scheme for Unsupervised Kernel Regression
Authors
Abstract
We show how to employ leave-K-out cross-validation in Unsupervised Kernel Regression, a recent method for learning of nonlinear manifolds. We thereby generalize an already present regularization method, yielding more flexibility without additional computational cost. We demonstrate our method on both toy and real data.
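The abstract describes generalizing leave-one-out regularization to leave-K-out. As a rough illustration of the idea in plain kernel regression (not the paper's UKR formulation, which operates on latent variables), a hedged sketch where each training point's K nearest samples, itself included, are excluded from its own reconstruction; K = 1 recovers ordinary leave-one-out, and larger K regularizes more strongly:

```python
import numpy as np

def leave_k_out_fit(x, y, h, k):
    """Kernel-regression reconstruction of y where, for each point, its
    k nearest training samples (including itself) are excluded from its
    own estimate. k = 1 is ordinary leave-one-out; larger k smooths more.
    Illustrative sketch only; function names are not from the paper."""
    # Pairwise squared distances and Gaussian kernel weights.
    d2 = (x[:, None] - x[None, :]) ** 2
    w = np.exp(-d2 / (2.0 * h ** 2))
    # Zero the weights of the k nearest training points for each row.
    idx = np.argsort(d2, axis=1)[:, :k]
    rows = np.repeat(np.arange(len(x)), k)
    w[rows, idx.ravel()] = 0.0
    return (w @ y) / w.sum(axis=1)
```

With K = 1 the diagonal of the weight matrix is zeroed, which is exactly the leave-one-out reconstruction; the single parameter K then trades reconstruction fidelity against smoothness at no extra cost per fit.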
Similar articles
Evolutionary kernel density regression
The Nadaraya–Watson estimator, also known as kernel regression, is a density-based regression technique. It weights output values with the relative densities in input space. The density is measured with kernel functions that depend on bandwidth parameters. In this work we present an evolutionary bandwidth optimizer for kernel regression. The approach is based on a robust loss function, leave-on...
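The Nadaraya–Watson estimator and leave-one-out bandwidth selection described above can be sketched as follows; the Gaussian kernel, toy sine data, and bandwidth grid are illustrative assumptions, not details from the paper:

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_query, h):
    """Nadaraya-Watson estimate with a Gaussian kernel of bandwidth h:
    a density-weighted average of the training outputs."""
    d2 = (x_query[:, None] - x_train[None, :]) ** 2
    w = np.exp(-d2 / (2.0 * h ** 2))
    return (w @ y_train) / w.sum(axis=1)

def loo_error(x, y, h):
    """Leave-one-out mean squared error for bandwidth h, computed by
    zeroing each point's own weight in the kernel matrix."""
    d2 = (x[:, None] - x[None, :]) ** 2
    w = np.exp(-d2 / (2.0 * h ** 2))
    np.fill_diagonal(w, 0.0)          # hold out each point in turn
    y_hat = (w @ y) / w.sum(axis=1)
    return np.mean((y - y_hat) ** 2)

# Pick the bandwidth minimising the LOO error on a toy sine dataset.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 2.0 * np.pi, 80))
y = np.sin(x) + 0.1 * rng.normal(size=80)
grid = np.linspace(0.05, 1.0, 20)
h_best = grid[np.argmin([loo_error(x, y, h) for h in grid])]
```

A grid search is used here for simplicity; the evolutionary optimizer of the cited work replaces it with a population-based search over the bandwidths, driven by the same leave-one-out loss.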
Optimally regularised kernel Fisher discriminant classification
Mika, Rätsch, Weston, Schölkopf and Müller [Mika, S., Rätsch, G., Weston, J., Schölkopf, B., & Müller, K.-R. (1999). Fisher discriminant analysis with kernels. In Neural networks for signal processing: Vol. IX (pp. 41-48). New York: IEEE Press] introduce a non-linear formulation of Fisher's linear discriminant, based on the now familiar "kernel trick", demonstrating state-of-the-art performance...
Efficient leave-one-out cross-validation of kernel Fisher discriminant classifiers
Mika et al. (in: Neural Network for Signal Processing, Vol. IX, IEEE Press, New York, 1999; pp. 41–48) apply the “kernel trick” to obtain a non-linear variant of Fisher’s linear discriminant analysis method, demonstrating state-of-the-art performance on a range of benchmark data sets. We show that leave-one-out cross-validation of kernel Fisher discriminant classifiers can be implemented with a ...
Cancerous Tissue Classification Using Microarray Gene Expression
In this project, we apply machine learning techniques to perform tumor vs. normal tissue classification using gene expression microarray data, which was proven to be useful for early-stage cancer diagnosis and cancer subtype identification. We compare the results of both supervised learning (k-nearest-neighbors, SVMs, boosting) and unsupervised learning (k-means clustering, hierarchical cluster...
Efficient leave-one-out cross-validation of kernel Fisher discriminant classifiers
Mika et al. [1] apply the “kernel trick” to obtain a non-linear variant of Fisher’s linear discriminant analysis method, demonstrating state-of-the-art performance on a range of benchmark datasets. We show that leave-one-out cross-validation of kernel Fisher discriminant classifiers can be implemented with a computational complexity of only O(l3) operations rather than the O(l4) of a naïve impl...